An Efficient Modified Derivative-Free Method on Bound Constrained Optimization

Authors

  • Xiaoli Zhang
  • Qinghua Zhou
Abstract

This paper introduces an efficient modified derivative-free method for bound constrained optimization problems, based on the coordinate search method. As the algorithm runs, it incorporates progressively obtained local information into the current iteration. Specifically, once two distinct suitable descent directions have been found, a composite expansion step is introduced, and a new point is produced through a line search technique. We then test the efficiency of the new method on benchmark problems. The computational results show the efficiency of the modified algorithm.
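The abstract does not give the algorithm in detail, but the general idea (poll the coordinate directions, and try an extra expansion step along a combination of two successful descent directions) can be sketched as follows. This is a minimal illustration under our own assumptions, not the authors' exact method; the function name, the expansion factor of 2, and the halving of the step size on a failed sweep are all hypothetical choices.

```python
import numpy as np

def coordinate_search(f, x0, lower, upper, step=0.5, tol=1e-6, max_iter=1000):
    """Hypothetical sketch: coordinate search with a composite expansion step.

    Polls the directions +/- e_i for each coordinate i, clipping to the
    bounds.  When a sweep finds two distinct descent directions, a longer
    "composite expansion" trial step is taken along their sum.
    """
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    fx = f(x)
    n = x.size
    for _ in range(max_iter):
        if step <= tol:
            break
        descent = []                      # descent directions found this sweep
        for i in range(n):
            for s in (+1.0, -1.0):
                d = np.zeros(n)
                d[i] = s
                y = np.clip(x + step * d, lower, upper)
                fy = f(y)
                if fy < fx:
                    descent.append(d)
                    x, fx = y, fy
                    break                 # next coordinate
        if len(descent) >= 2:
            # composite expansion step along the sum of two descent directions
            d = descent[0] + descent[1]
            y = np.clip(x + 2.0 * step * d, lower, upper)
            fy = f(y)
            if fy < fx:
                x, fx = y, fy
        if not descent:
            step *= 0.5                   # contract when the sweep fails
    return x, fx
```

On a simple bound-constrained quadratic such as f(x) = ||x - 0.3||^2 over [0, 1]^2, this sketch converges to the interior minimizer; the composite step mainly helps when several coordinates improve at once, allowing a longer diagonal move than a single coordinate poll.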


Similar Articles

Augmented Downhill Simplex a Modified Heuristic Optimization Method

The Augmented Downhill Simplex Method (ADSM) is introduced here as a heuristic combination of the Downhill Simplex Method (DSM) with a random search algorithm. DSM is an interpretable nonlinear local optimization method; however, being a local exploitation algorithm, it can become trapped in a local minimum. In contrast, random search is a global exploration method, but less efficient. Here, rand...


Constrained Production Optimization with an Emphasis on Derivative-free Methods

Production optimization involves determining the optimum well controls to maximize an objective function such as cumulative oil production or net present value. In practice, this problem additionally requires the satisfaction of physical and economic constraints, so the overall problem represents a challenging nonlinearly constrained optimization. This work entails a comparative study of...


A Derivative-free Method for Linearly Constrained Nonsmooth Optimization

This paper develops a new derivative-free method for solving linearly constrained nonsmooth optimization problems. The objective functions in these problems are, in general, non-regular locally Lipschitz continuous functions, and the computation of generalized subgradients of such functions is a difficult task. In this paper we suggest an algorithm for the computation of subgradients of a broad class ...


A Linesearch-Based Derivative-Free Approach for Nonsmooth Constrained Optimization

In this paper, we propose new linesearch-based methods for nonsmooth constrained optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence towards stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequal...


An active-set trust-region method for derivative-free nonlinear bound-constrained optimization

We consider an implementation of a recursive model-based active-set trust-region method for solving bound-constrained nonlinear non-convex optimization problems without derivatives, using the technique of self-correcting geometry proposed in [24]. Considering an active-set method in model-based optimization creates the opportunity of saving a substantial amount of function evaluations when mainta...



Publication date: 2014